Markov process

noun, Statistics.
1.
a process in which the future values of a random variable depend only on its present value and not on the sequence of events that preceded it.
Also, Markoff process.
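In standard probability notation (a supplementary formulation, not part of the dictionary entry), the defining Markov property for a discrete-time process X_1, X_2, ... reads:

    P(X_{n+1} = x \mid X_n = x_n, \dots, X_1 = x_1) = P(X_{n+1} = x \mid X_n = x_n)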
Origin
1935-40; after Russian mathematician Andreĭ Andreevich Markov (1856-1922), who first studied such processes
Markov process in Technology

probability, simulation
A process in which the sequence of events can be described by a Markov chain.
(1995-02-23)
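To make the definition concrete, here is a minimal Python sketch of simulating such a process as a Markov chain; the two weather states and their transition probabilities are invented for illustration and do not come from either entry.

    import random

    # Illustrative two-state chain; states and probabilities are made up.
    TRANSITIONS = {
        "sunny": [("sunny", 0.8), ("rainy", 0.2)],
        "rainy": [("sunny", 0.4), ("rainy", 0.6)],
    }

    def simulate(start, steps, seed=0):
        """Generate a state sequence; each step depends only on the current state."""
        rng = random.Random(seed)
        state = start
        sequence = [state]
        for _ in range(steps):
            states, weights = zip(*TRANSITIONS[state])
            state = rng.choices(states, weights=weights)[0]
            sequence.append(state)
        return sequence

    print(simulate("sunny", 10))

Because the next state is drawn only from TRANSITIONS[state], the printed sequence exhibits the Markov property stated above.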